Modified Simplex Imbeddings Method in Convex Non-differentiable Optimization
Author
Abstract
We consider a new interpretation of the modified simplex imbeddings method. The central construction of this method is a simplex that contains a solution of a convex non-differentiable problem. A cutting plane drawn through the simplex center is used to delete the part of the simplex that does not contain the solution. The most interesting feature of this method is its convergence estimate, which depends only on the number of simplex vertices cut off by the cutting hyperplane: the more vertices the hyperplane cuts off, the higher the method's rate of convergence. We also consider a special technique for constructing the simplex containing the set of points that define the truncated simplex. This approach lets us reduce the problem of constructing the minimal-volume simplex to a structural optimization problem, for which quite efficient interior-point schemes for finding the optimal solution are available. The results of a numerical experiment are also given in this paper.
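The cut step described above can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the function name `cut_simplex` and the simple vertex/edge-intersection bookkeeping are assumptions, and the re-embedding into a minimal-volume simplex (the structural optimization part) is not shown.

```python
import numpy as np

def cut_simplex(vertices, center, subgrad):
    """One cut step of a simplex-embedding scheme (illustrative sketch).

    For a convex objective with subgradient g at the simplex center c, the
    minimizer lies in the half-space {x : g . (x - c) <= 0}.  Vertices on
    the other side of the hyperplane are cut off; edges crossing the
    hyperplane contribute intersection points.  The returned points define
    the truncated simplex that the method then re-embeds in a new simplex.
    """
    s = (vertices - center) @ subgrad          # signed side of each vertex
    kept_idx = [i for i, si in enumerate(s) if si <= 0.0]
    cut_idx = [i for i, si in enumerate(s) if si > 0.0]
    points = [vertices[i] for i in kept_idx]
    for i in cut_idx:                          # intersect each cut edge
        for j in kept_idx:
            t = s[i] / (s[i] - s[j])           # parameter in (0, 1]
            points.append(vertices[i] + t * (vertices[j] - vertices[i]))
    return np.array(points), len(cut_idx)
```

The second return value, the number of cut-off vertices, is the quantity that drives the convergence estimate in the abstract above.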
Similar resources
Augmented Downhill Simplex a Modified Heuristic Optimization Method
The Augmented Downhill Simplex Method (ADSM), introduced here, is a heuristic combination of the Downhill Simplex Method (DSM) with a random search algorithm. DSM is an interpretable nonlinear local optimization method; however, as a local exploitation algorithm it can be trapped in a local minimum. In contrast, random search is a global exploration method, but less efficient. Here, rand...
Convergence of the Restricted Nelder-Mead Algorithm in Two Dimensions
The Nelder–Mead algorithm, a longstanding direct search method for unconstrained optimization published in 1965, is designed to minimize a scalar-valued function f of n real variables using only function values, without any derivative information. Each Nelder–Mead iteration is associated with a nondegenerate simplex defined by n + 1 vertices and their function values; a typical iteration produce...
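As a point of comparison, the standard (unrestricted) Nelder–Mead method is available in SciPy; the sketch below minimizes a smooth test function using only function evaluations. The quadratic test function is our own choice, not taken from the paper:

```python
from scipy.optimize import minimize

# Minimize f(x, y) = (x - 1)^2 + (y + 2)^2 using only function values;
# Nelder-Mead needs no gradient information.
result = minimize(lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2,
                  x0=[0.0, 0.0], method="Nelder-Mead")
print(result.x)  # close to (1, -2)
```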
A restarted and modified simplex search for unconstrained optimization
We propose in this paper a simple but efficient modification of the well-known Nelder-Mead (NM) simplex search method for unconstrained optimization. Instead of moving all n simplex vertices at once in the direction of the best vertex, our "shrink" step moves them in the same direction but one by one until an improvement is obtained. In addition, for solving non-convex problems, we simply resta...
Quadratic cost flow and the conjugate gradient method
By introducing quadratic penalty terms, a convex nonseparable quadratic network program can be reduced to an unconstrained optimization problem whose objective function is a piecewise quadratic and continuously differentiable function. A conjugate gradient method is applied to the reduced problem and its convergence is proved. The computation exploits the special network data structures origina...
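For the inner solver mentioned above, the linear conjugate gradient iteration on a quadratic objective 0.5 x'Ax - b'x can be sketched as follows; this is a generic textbook implementation, not the network-specialized variant of the paper:

```python
import numpy as np

def conjugate_gradient(A, b, x0, tol=1e-10, max_iter=100):
    """Minimize 0.5 x'Ax - b'x for symmetric positive definite A,
    i.e. solve A x = b, by the linear conjugate gradient method.

    Each step uses one matrix-vector product and an exact line search
    along mutually A-conjugate directions.
    """
    x = np.array(x0, dtype=float)
    r = b - A @ x                  # residual = negative gradient
    p = r.copy()                   # first search direction
    rr = r @ r
    for _ in range(max_iter):
        if np.sqrt(rr) < tol:
            break
        Ap = A @ p
        alpha = rr / (p @ Ap)      # exact step length
        x += alpha * p
        r -= alpha * Ap
        rr_new = r @ r
        p = r + (rr_new / rr) * p  # next A-conjugate direction
        rr = rr_new
    return x
```

In exact arithmetic the method terminates in at most n iterations for an n-by-n system, which is why it suits the large reduced problems produced by the quadratic penalty reformulation.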
An algorithm for approximating nondominated points of convex multiobjective optimization problems
In this paper, we present an algorithm for generating approximate nondominated points of a multiobjective optimization problem (MOP), where the constraints and the objective functions are convex. We provide outer and inner approximations of nondominated points and prove that inner approximations provide a set of approximate weakly nondominated points. The proposed algorithm can be appl...